Mutual Information and Minimum Mean-Square Error in Multiuser Gaussian Channels
Abstract
1. Introduction

Due to the lack of explicit closed-form expressions for the mutual information with binary inputs, which have been provided only for BPSK and QPSK in the single-input single-output (SISO) case [1], [2], [3], it is of particular importance to address connections between information theory and estimation theory in the multiuser case. Connections between information theory and estimation theory date back to the work of Duncan [4], who showed that for the continuous-time additive white Gaussian noise (AWGN) channel, the filtering minimum mean squared error (causal estimation) is twice the input-output mutual information for any underlying signal distribution. More recently, Guo, Shamai, and Verdú illuminated intimate connections between information theory and estimation theory in a seminal paper [1]. In particular, Guo et al. showed that in the classical problem of information transmission through the conventional AWGN channel, the derivative of the mutual information with respect to the SNR is equal to the smoothing minimum mean squared error (noncausal estimation), a relationship that holds for scalar, vector, discrete-time, and continuous-time channels regardless of the input statistics. These results have been extended to the case of mismatched input distributions in the scalar Gaussian channel in [5] and [6]. However, the fundamental relation between the derivative of the mutual information and the MMSE, known as the I-MMSE identity and established for point-to-point channels with any noise or input distributions in [1], is no longer suitable for the multiuser case. Therefore, in this paper, we revisit the connections between the mutual information and the MMSE in the multiuser setup, and we generalize the I-MMSE relation to the multiuser case.
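For reference, the single-user identity of [1] being generalized can be stated as follows for the scalar real-valued channel $Y = \sqrt{\mathsf{snr}}\,X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$ (this restatement is ours; the factor $\tfrac{1}{2}$ is specific to real-valued channels and is absent in the complex-valued formulation):

```latex
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I\!\left(X;\sqrt{\mathsf{snr}}\,X+N\right)
  = \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr}),
\qquad
\mathrm{mmse}(\mathsf{snr})
  = \mathbb{E}\!\left[\left(X-\mathbb{E}\!\left[X \,\middle|\, \sqrt{\mathsf{snr}}\,X+N\right]\right)^{2}\right].
```

The identity holds for any input distribution with finite second moment, which is what makes it a natural starting point for multiuser extensions.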
In particular, we prove that the derivative of the mutual information with respect to the signal-to-noise ratio (SNR) is equal to the minimum mean squared error (MMSE) plus a covariance induced by the interference, quantified by a term involving the cross-correlation of the users' input estimates, their channels, and their precoding matrices. Further, we capitalize on this unveiled multiuser I-MMSE relation to derive the components of the multiuser mutual information. In particular, we derive the derivative of the conditional and non-conditional mutual information with respect to the SNR. Further extensions of this result allow a generalization of the relations for linear vector Gaussian channels in [7] to multiuser channels. In particular, [8], [9] generalize the I-MMSE relation to the per-user gradient …
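The single-user scalar identity that these results build on can be checked numerically for BPSK inputs, where both the mutual information (in nats) and the MMSE reduce to one-dimensional Gaussian integrals: $I(\mathsf{snr}) = \mathsf{snr} - \mathbb{E}[\ln\cosh(\mathsf{snr}+\sqrt{\mathsf{snr}}\,N)]$ and $\mathrm{mmse}(\mathsf{snr}) = 1 - \mathbb{E}[\tanh(\mathsf{snr}+\sqrt{\mathsf{snr}}\,N)]$. The sketch below is illustrative only (it is not from the paper) and verifies $\mathrm{d}I/\mathrm{d}\,\mathsf{snr} = \tfrac{1}{2}\,\mathrm{mmse}$ by a finite difference:

```python
import math

def gauss_expect(f, n=2001, lim=8.0):
    # E[f(N)] for N ~ N(0,1), trapezoidal rule on [-lim, lim]
    # (the Gaussian tail beyond |y| = 8 is negligible here)
    h = 2.0 * lim / (n - 1)
    total = 0.0
    for i in range(n):
        y = -lim + i * h
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * f(y) * math.exp(-y * y / 2.0) / math.sqrt(2.0 * math.pi)
    return total * h

def mi_bpsk(snr):
    # I(snr) in nats for Y = sqrt(snr) X + N, X uniform on {-1, +1}
    s = math.sqrt(snr)
    return snr - gauss_expect(lambda y: math.log(math.cosh(snr + s * y)))

def mmse_bpsk(snr):
    # mmse(snr) = 1 - E[tanh(snr + sqrt(snr) N)] for BPSK
    s = math.sqrt(snr)
    return 1.0 - gauss_expect(lambda y: math.tanh(snr + s * y))

snr, d = 1.0, 1e-4
deriv = (mi_bpsk(snr + d) - mi_bpsk(snr - d)) / (2.0 * d)  # central difference
print(f"dI/dsnr = {deriv:.6f}, mmse/2 = {0.5 * mmse_bpsk(snr):.6f}")
```

The two printed values agree to numerical precision, in line with the scalar I-MMSE identity; the multiuser relation above replaces the right-hand side with the total MMSE plus the interference-induced covariance term.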
Similar resources
Gaussian Channels: Information, Estimation and Multiuser Detection
This thesis represents an addition to the theory of information transmission, signal estimation, nonlinear filtering, and multiuser detection over channels with Gaussian noise. The work consists of two parts based on two problem settings—single-user and multiuser—which draw on different techniques in their development. The first part considers canonical Gaussian channels with an input of arbitrary...
Generalized I-MMSE for K-User Gaussian Channels
In this paper, we generalize the fundamental relation between the mutual information and the minimum mean squared error (MMSE) by Guo, Shamai, and Verdú [1] to K-user Gaussian channels. We prove that the derivative of the multiuser mutual information with respect to the signal-to-noise ratio (SNR) is equal to the total MMSE plus a covariance term with respect to the cross-correlation of the mul...
Multiuser I-MMSE
In this paper, we generalize the fundamental relation between the derivative of the mutual information and the minimum mean squared error (MMSE) to multiuser setups. We prove that the derivative of the mutual information with respect to the signal-to-noise ratio (SNR) is equal to the MMSE plus a covariance induced by the interference, quantified by a term with respect to the cross-correlati...
Multiple Access Gaussian Channels with Arbitrary Inputs: Optimal Precoding and Power Allocation
In this paper, we derive new closed-form expressions for the gradient of the mutual information with respect to arbitrary parameters of the two-user multiple access channel (MAC). The derived relations generalize the fundamental relation between the derivative of the mutual information and the minimum mean squared error (MMSE) to multiuser setups. We prove that the derivative of the mutual info...
Capacity and Normalized Optimal Detection Error in Gaussian Channels
For vector Gaussian channels, a precise differential connection between channel capacity and a quantity termed normalized optimal detection error (NODE) is presented. Then, this C–NODE relationship is extended to continuous-time Gaussian channels drawing on a waterfilling characterization recently found for the capacity of continuous-time linear time-varying channels. In the latter case, the C–...
Publication date: 2015